Comparing Structure Learning Methods for RKHS Embeddings of Protein Structures
Abstract
Non-parametric graphical models embedded in reproducing kernel Hilbert spaces provide a framework for modeling multi-modal, arbitrary multivariate distributions, which is essential when modeling complex protein structures. Non-parametric belief propagation requires the structure of the graphical model to be known a priori. Non-parametric structure learning algorithms are currently available only for tree structures, but a tree structure is not a reasonable assumption when modeling protein molecular networks. In this paper, we compare the parametric neighborhood selection structure learning method, which can recover general graph structures, to the non-parametric tree learning method on the particular task of modeling protein structures represented as sequences of torsion angles. Our experiments, performed on molecular dynamics simulation data of the Engrailed Homeodomain protein, show that the neighborhood selection method outperforms the non-parametric tree structure learning method. We also find that, when an appropriate kernel is used, non-parametric models outperform both the semi-parametric non-paranormal model and the parametric sparse Gaussian graphical model.
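For readers unfamiliar with the parametric baseline, the sketch below outlines a Meinshausen-Bühlmann-style neighborhood selection: each variable is lasso-regressed on all the others, and non-zero coefficients mark graph edges. This is only an assumed illustration, not the paper's implementation; in particular, circular torsion angles would first need a suitable encoding (e.g. sine/cosine pairs), and the regularization strength `alpha` is a placeholder.

```python
import numpy as np
from sklearn.linear_model import Lasso

def neighborhood_selection(X, alpha=0.1):
    """Estimate an undirected graph by lasso-regressing each variable on
    the remaining ones; non-zero coefficients define its neighborhood."""
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        coef = Lasso(alpha=alpha).fit(X[:, others], X[:, j]).coef_
        adj[j, others] = coef != 0
    # Symmetrize with the OR rule: keep an edge if either regression selects it.
    return adj | adj.T
```

In practice `alpha` would be chosen by cross-validation or a stability criterion, and the AND rule (`adj & adj.T`) is an equally common way to symmetrize the estimate.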
Related articles
Conditional mean embeddings as regressors
We demonstrate an equivalence between reproducing kernel Hilbert space (RKHS) embeddings of conditional distributions and vector-valued regressors. This connection introduces a natural regularized loss function which the RKHS embeddings minimise, providing an intuitive understanding of the embeddings and a justification for their use. Furthermore, the equivalence allows the application of vecto...
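To make the regression view concrete, the following minimal numpy sketch (an illustrative assumption, not the authors' code) computes the weights of the standard empirical conditional mean embedding: the embedding of P(Y | X = x*) is a weighted sum of the feature maps of the training outputs, with weights obtained from a kernel-ridge-regression-style linear system.

```python
import numpy as np

def gauss_kernel(A, B, sigma=1.0):
    # Pairwise squared distances followed by a Gaussian kernel
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def conditional_embedding_weights(X, x_star, lam=1e-3, sigma=1.0):
    """Weights beta(x*) = (K_X + n*lam*I)^{-1} k_X(x*); the empirical
    conditional mean embedding is then mu_{Y|x*} = sum_i beta_i * phi(y_i)."""
    n = X.shape[0]
    K = gauss_kernel(X, X, sigma)
    k = gauss_kernel(X, x_star.reshape(1, -1), sigma)
    beta = np.linalg.solve(K + n * lam * np.eye(n), k)
    return beta.ravel()
```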
Online Relative Entropy Policy Search using Reproducing Kernel Hilbert Space Embeddings
Kernel methods have been successfully applied to reinforcement learning problems to address some challenges such as high dimensional and continuous states, value function approximation and state transition probability modeling. In this paper, we develop an online policy search algorithm based on a recent state-of-the-art algorithm REPS-RKHS that uses conditional kernel embeddings. Our online al...
Kernel Choice and Classifiability for RKHS Embeddings of Probability Distributions
Embeddings of probability measures into reproducing kernel Hilbert spaces have been proposed as a straightforward and practical means of representing and comparing probabilities. In particular, the distance between embeddings (the maximum mean discrepancy, or MMD) has several key advantages over many classical metrics on distributions, namely easy computability, fast convergence and low bias of...
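As a concrete illustration of the distance between embeddings, the sketch below (an assumed minimal example, not the authors' implementation) computes a biased estimate of the squared MMD between two samples under a Gaussian kernel; the bandwidth `sigma` is a placeholder and in practice is often set by the median heuristic.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    """Biased estimate of MMD^2 = E k(x,x') + E k(y,y') - 2 E k(x,y)."""
    Kxx = rbf_kernel(X, X, sigma)
    Kyy = rbf_kernel(Y, Y, sigma)
    Kxy = rbf_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()
```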
Modelling transition dynamics in MDPs with RKHS embeddings
We propose a new, nonparametric approach to learning and representing transition dynamics in Markov decision processes (MDPs), which can be combined easily with dynamic programming methods for policy optimisation and value estimation. This approach makes use of a recently developed representation of conditional distributions as embeddings in a reproducing kernel Hilbert space (RKHS). Such repre...
Hypothesis testing using pairwise distances and associated kernels
We provide a unifying framework linking two classes of statistics used in two-sample and independence testing: on the one hand, the energy distances and distance covariances from the statistics literature; on the other, distances between embeddings of distributions to reproducing kernel Hilbert spaces (RKHS), as established in machine learning. The equivalence holds when energy distances are co...
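The statistic on the statistics side of this equivalence is simple to compute; the sketch below (an illustrative assumption, not code from the paper) estimates the energy distance between two samples, which agrees, up to a constant factor, with a squared MMD under a distance-induced kernel.

```python
import numpy as np
from scipy.spatial.distance import cdist

def energy_distance(X, Y):
    """Plug-in estimate of 2*E||X-Y|| - E||X-X'|| - E||Y-Y'||."""
    dxy = cdist(X, Y).mean()
    dxx = cdist(X, X).mean()
    dyy = cdist(Y, Y).mean()
    return 2 * dxy - dxx - dyy
```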
Journal:
Volume / Issue:
Pages: -
Publication date: 2012